Once again June brings us the Design Automation Conference. But what a difference a DAC makes.
Last year's DAC came on the wings of unending prosperity. We were confidently assured that the revolutions of Web and wireless had extinguished the business cycle. We would see growth - moderate, exciting growth or explosive, gut-wrenching growth, but growth only.
Ah, well.
Now we are engaged in paying the wages of that optimism. For a minority - but a far too large minority - of engineers, the invoice will come on a pink slip. For many more, the wages will be met with deteriorated options, lost opportunities or the embarrassingly quiet disappearance of friends and associates. We at ISD have not been immune.
There is another aspect to our economic pause, however, that directly concerns many of us. It is economic in origin but purely technical in its implications.
That aspect begins with the reasons for the abrupt slowdown that seemed to simultaneously strike dot-coms, computers, networking and wireless. All of those markets plateaued at about the same time. End users grew listless, inventories piled up, and the rest is recent history.
Now when a market reaches a plateau, buying slows down and the pattern of buying changes. Buyers become indifferent to evolutionary feature creep. They have trouble seeing beyond the price tag.
Add to that the need for OEMs to get rid of a substantial inventory of already-built stuff, and you get a whole new set of marching orders coming to engineering. Instead of turning a sort-of new design every few months, the direction now, in many organizations, is to step back, take a deep breath and start on a new-generation design.
The next generation will mean different things for different design teams. For some it will mean radically higher integration. For others it will involve moving to a new process node, with the concomitant uprooting of the tool chain. For still others, it will mean going back to fix some of the methodological problems that cropped up - and were promptly hammered down - in previous designs. Verification comes to mind.
Design teams are weighing several specific quantum steps to take in the ensuing months. For intense FPGA users, it may be the opportunity to tackle the methodology for the new big, CPU-integrating parts. How do you floor plan such things, and how do you balance the roles of synthesis and mapping to get predictable results?
For teams moving on to advanced geometries - and yes, people are seeing parts at 0.13 and below now - there are other issues. Everyone seems to agree that you can reduce iterations by combining - that is, sharing data between - multiple elements of the tool chain. But which ones? Floor-planning and synthesis? Synthesis and coarse placement? Placement and routing? Wait until there is a single all-in-one tool?
And what about verification? Where does formal verification, with its reassuring name but its tendency toward complexity and its tiny appetite for block size, really fit? How about random vectors? Is anyone smart enough to guide exploration anymore? With multiple asynchronous clock domains, gated clocks and blindingly fast external busses, can we even call throwing test vectors at a design “verification” anymore?
The industry pause may give us time to take a hard look at these questions. We at ISD intend to do our share of asking, bringing you the best papers we can find. To that end, we are augmenting our editorial group. Nic Mokhoff, longtime CMP editorial guru, is joining ISD as editor. I will remain editorial director.
So we will be seeing you in Las Vegas. Now is a great time to ask the hard questions, to have the informed debates and to use the professional tools of experiment, report, peer review and analysis. Go for it. Here's to a rollicking good DAC!